FedFixer: Mitigating Heterogeneous Label Noise in Federated Learning

Ji, Xinyuan, Zhu, Zhaowei, Xi, Wei, Gadyatskaya, Olga, Song, Zilong, Cai, Yong, Liu, Yang

arXiv.org Artificial Intelligence

Federated Learning (FL) depends heavily on label quality for its performance. However, the label distribution among individual clients is often both noisy and heterogeneous. Under heterogeneous label noise, client-specific samples also incur high loss, making it difficult to distinguish them from noisy label samples and undermining the effectiveness of existing label noise learning approaches. To tackle this issue, we propose FedFixer, which introduces a personalized model that cooperates with the global model to effectively select clean client-specific samples. In this dual-model setup, updating the personalized model solely at the local level can cause it to overfit noisy data due to limited samples, degrading the performance of both the local and global models. We mitigate this overfitting from two perspectives. First, we employ a confidence regularizer to alleviate the impact of unconfident predictions caused by label noise. Second, we apply a distance regularizer to constrain the disparity between the personalized and global models. We validate the effectiveness of FedFixer through extensive experiments on benchmark datasets. The results demonstrate that FedFixer filters noisy label samples well across different clients, especially in highly heterogeneous label noise scenarios.
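The dual-model objective described in the abstract can be illustrated with a minimal sketch. The exact regularizer forms are not given in the abstract, so this example assumes an entropy penalty as the confidence regularizer and a squared L2 gap between personalized and global parameters as the distance regularizer; the function name `fedfixer_style_loss` and the weights `lam_conf`, `lam_dist` are illustrative, not from the paper.

```python
import numpy as np

def softmax(z):
    """Numerically stable row-wise softmax."""
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def fedfixer_style_loss(logits, labels, w_personal, w_global,
                        lam_conf=0.1, lam_dist=0.1):
    """Sketch of a dual-model local objective: cross-entropy plus
    a confidence regularizer (here assumed to be prediction entropy,
    penalizing unconfident outputs) and a distance regularizer
    (L2 gap keeping the personalized model near the global one)."""
    p = softmax(logits)
    n = len(labels)
    # Standard cross-entropy on the (selected) local samples.
    ce = -np.log(p[np.arange(n), labels] + 1e-12).mean()
    # Confidence regularizer: mean entropy of the predictive distribution.
    conf = -(p * np.log(p + 1e-12)).sum(axis=1).mean()
    # Distance regularizer: squared distance between parameter vectors.
    dist = np.sum((w_personal - w_global) ** 2)
    return ce + lam_conf * conf + lam_dist * dist
```

Under this sketch, moving the personalized weights away from the global weights strictly increases the loss, which is the constraining effect the second regularizer is meant to provide.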


One-Step Abductive Multi-Target Learning with Diverse Noisy Label Samples

Yang, Yongquan

arXiv.org Artificial Intelligence

One-step abductive multi-target learning (OSAMTL) [1] was proposed to address situations where it is difficult or even impossible for experts to manually produce accurate ground-truth labels, which leads to labels with complex noise for a given learning task. On an H. pylori segmentation task over medical histopathology whole slide images [1,2], OSAMTL has been shown to possess significant potential for handling complex noisy labels, as measured by logical rationality evaluations based on the logical assessment formula (LAF) [1,3]. However, OSAMTL is not suited to learning with diverse noisy label samples. In this paper, we aim to address this issue. First, we give a definition of diverse noisy label samples (DNLS). Second, based on this definition, we propose one-step abductive multi-target learning with DNLS (OSAMTL-DNLS). Finally, we analyze OSAMTL-DNLS in comparison with the original OSAMTL.
